Over binary-input channels, the uniform input distribution is universal, in the sense that it maximizes the worst-case mutual information over all binary-input channels, ensuring at least 94.2% of the capacity. In this paper, we address a similar question, but with respect to a universal generalized linear decoder. We look for the best collection of finitely many a posteriori metrics that maximizes the worst-case mismatched mutual information achieved by decoding with these metrics (instead of an optimal decoder, such as Maximum Likelihood (ML), tuned to the true channel). It is shown that for binary input and output channels, two metrics suffice to achieve the same performance as an optimal decoder. In particular, this implies that there exists a decoder which is generalized linear and achieves at least 94.2% of the compound capacity on any compound set, without knowledge of the underlying set.
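The 94.2% figure can be checked numerically for a concrete channel. The sketch below (not from the paper; function names are illustrative) computes the mutual information of a Z-channel under a uniform input prior and compares it with the capacity obtained by a grid search over input priors; for a very noisy Z-channel the ratio comes close to the worst-case constant while staying above it.

```python
import math


def binary_entropy(q):
    """Binary entropy H(q) in bits."""
    if q in (0.0, 1.0):
        return 0.0
    return -q * math.log2(q) - (1 - q) * math.log2(1 - q)


def mutual_information(p, a, b):
    """I(X;Y) for input prior P(X=1)=p over the binary-input,
    binary-output channel with P(Y=1|X=0)=a and P(Y=1|X=1)=b."""
    y1 = (1 - p) * a + p * b  # output distribution P(Y=1)
    return binary_entropy(y1) - (1 - p) * binary_entropy(a) - p * binary_entropy(b)


def capacity(a, b, n=10000):
    """Capacity approximated by a grid search over the input prior."""
    return max(mutual_information(i / n, a, b) for i in range(n + 1))


# Z-channel: input 0 is received perfectly; input 1 flips with prob. eps.
eps = 0.9
C = capacity(0.0, 1 - eps)
I_unif = mutual_information(0.5, 0.0, 1 - eps)
print(I_unif / C)  # ratio of uniform-input rate to capacity, above 0.942
```

For this channel the uniform input is noticeably suboptimal (the capacity-achieving prior puts less mass on the noisy input symbol), yet the ratio remains above the 0.942 worst-case guarantee cited above.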